Definition of a Computer - Before 1935, a "computer" was a person who performed arithmetic calculations. Between 1935 and 1945 the definition shifted to refer to a machine rather than a person. The modern definition is based on von Neumann's concepts: a device that accepts input, processes data, stores data, and produces output. We have gone from the vacuum tube to the transistor to the microchip; then the microchip started talking to the modem. Now we exchange text, sound, photos, and movies in a digital environment.
Basic Computing
Moore's Law - The number of transistors on integrated circuits doubles every two years; as time goes on, it becomes less expensive to produce more powerful computers.
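To make the doubling concrete, here is a minimal Python sketch of the exponential growth Moore's Law implies; the 2,300-transistor baseline (the Intel 4004 from 1971) is an illustrative figure, not something from these notes.

```python
# Moore's Law as a formula: the count doubles every 2 years,
# so count(t) = start * 2 ** (t / 2) after t years.
def transistor_count(years_elapsed, start=2300):  # 2300 ~ Intel 4004 (illustrative)
    return start * 2 ** (years_elapsed / 2)

for years in (0, 10, 20, 30):
    print(f"after {years:2d} years: ~{transistor_count(years):,.0f} transistors")
```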
Computers use "boolean algebra", a system of manipulating variables with 1 of 2 values. (e.g. true/false, on/off, high/low)
Common boolean operators- and, or, not
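As a quick illustration, Python exposes the same three operators directly (a minimal sketch; the variable names are arbitrary):

```python
a, b = True, False  # Boolean variables: each holds one of two values

print(a and b)  # AND: true only when both operands are true -> False
print(a or b)   # OR:  true when at least one operand is true -> True
print(not a)    # NOT: inverts the value -> False
```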
The basic unit of computing is the "bit"; computers perform all of their functions by manipulating bits, each of which holds a 1 or a 0.
There are 8 bits in a byte, which can represent 256 values from 0 to 255.
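A short sketch of the byte arithmetic above, using Python's built-in binary conversions:

```python
print(2 ** 8)              # 8 bits give 2**8 = 256 distinct values
print(bin(255))            # '0b11111111' -- the largest byte value, all 8 bits set
print(int("11111111", 2))  # 255 -- converting the 8-bit pattern back to decimal
```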
Computers use combinational and sequential circuits to function
half adder - adds two one-bit binary numbers, producing a sum bit and a carry bit
full adder - adds two one-bit binary numbers plus a carry-in bit from a previous stage, producing a sum bit and a carry-out bit
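Below is a minimal sketch of both adders built from the Boolean operators above (a standard textbook construction using XOR and AND gates; the function names are my own):

```python
def half_adder(a, b):
    """Add two one-bit numbers; return (sum_bit, carry)."""
    return a ^ b, a & b                # XOR gives the sum bit, AND gives the carry

def full_adder(a, b, carry_in):
    """Add two one-bit numbers plus a carry-in; return (sum_bit, carry_out)."""
    s1, c1 = half_adder(a, b)          # first stage: a + b
    s2, c2 = half_adder(s1, carry_in)  # second stage: add the carry-in
    return s2, c1 | c2                 # carry out if either stage carried

print(half_adder(1, 1))     # (0, 1): 1 + 1 = binary 10
print(full_adder(1, 1, 1))  # (1, 1): 1 + 1 + 1 = binary 11
```

Chaining full adders so that each carry-out feeds the next stage's carry-in gives a ripple-carry adder for multi-bit numbers.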
ASCII - American Standard Code for Information Interchange, uses 7 bits
binary values 0-127 are each assigned a character
values 128-255 belong to 8-bit extended character sets, which add accented characters from foreign languages
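Python's ord() and chr() functions expose the ASCII mapping directly, for example:

```python
print(ord("A"))  # 65  -- the ASCII value assigned to 'A'
print(chr(97))   # 'a' -- the character assigned to value 97
print(max(ord(c) for c in "Hello!"))  # 111 -- plain ASCII text stays within 0-127
```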
https://canvas.instructure.com/courses/739593/wiki/week-1-keywords?module_item_id=4202119